
    Remotely Powered Reconfigurable Receiver for Extreme Environment Sensing Platforms

    Wireless sensors connected in a local network offer revolutionary exploration capabilities, but current solutions do not work in extreme environments of low temperature (200 K) and low to moderate radiation levels (<50 krad). These sensors (temperature, radiation, infrared, etc.) would need to operate outside the spacecraft/lander and be totally independent of spacecraft/lander power. Flash-memory field-programmable gate arrays (FPGAs) are being used as the main signal-processing and protocol-generation platform in a new receiver. Flash-based FPGAs have been shown to offer at least a 100× reduction in standby power and a 10× reduction in operating power compared with conventional SRAM-based FPGA technology.
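
    As a rough illustration of what those reduction factors imply for a remotely powered node, the sketch below compares daily energy budgets for a duty-cycled receiver. The absolute power figures and the duty cycle are illustrative assumptions; only the 100× and 10× factors come from the abstract.

```python
# Hypothetical energy-budget comparison for a duty-cycled sensor node.
# The baseline power figures below are illustrative assumptions, not
# measurements from the abstract; only the 100x standby and 10x
# operating reduction factors come from the text.

SRAM_STANDBY_W = 0.050   # assumed SRAM-FPGA standby power (W)
SRAM_ACTIVE_W = 0.500    # assumed SRAM-FPGA operating power (W)

FLASH_STANDBY_W = SRAM_STANDBY_W / 100  # 100x standby reduction
FLASH_ACTIVE_W = SRAM_ACTIVE_W / 10     # 10x operating reduction

DUTY_CYCLE = 0.01        # node is active 1% of the time (assumed)
SECONDS_PER_DAY = 86_400

def daily_energy_j(standby_w: float, active_w: float) -> float:
    """Energy per day (joules) for a node that idles between bursts."""
    active_s = DUTY_CYCLE * SECONDS_PER_DAY
    return active_w * active_s + standby_w * (SECONDS_PER_DAY - active_s)

sram_j = daily_energy_j(SRAM_STANDBY_W, SRAM_ACTIVE_W)
flash_j = daily_energy_j(FLASH_STANDBY_W, FLASH_ACTIVE_W)
print(f"SRAM-based FPGA:  {sram_j:8.1f} J/day")
print(f"Flash-based FPGA: {flash_j:8.1f} J/day "
      f"({sram_j / flash_j:.0f}x less energy)")
```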

    Radiation-Hardened Solid-State Drive

    A method is provided for a radiation-hardened (rad-hard) solid-state drive for space-mission memory applications that combines rad-hard and commercial off-the-shelf (COTS) non-volatile memories (NVMs) into a hybrid architecture. The architecture is controlled by a rad-hard ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array). Specific error-handling and data-management protocols are developed for use in a rad-hard environment. The rad-hard memories are smaller in overall memory density, but are used to control and manage radiation-induced errors in the main, much larger, non-rad-hard COTS memory devices. Small amounts of rad-hard memory serve as error buffers and temporary caches for radiation-induced errors in the large COTS memories. The rad-hard ASIC/FPGA implements a variety of error-handling protocols to manage these radiation-induced errors. The large COTS memory is triplicated for protection, and CRC-based counters are calculated for sub-areas in each COTS NVM array. These counters are stored in the rad-hard non-volatile memory. Through monitoring, rewriting, regeneration, triplication, and long-term storage, radiation-induced errors in the large NV memory are managed. The rad-hard ASIC/FPGA also interfaces with the external computer buses.
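
    Two of the described mechanisms, triplication with majority voting and per-sub-area CRC values held in rad-hard memory, can be modeled compactly in software. The sketch below is a toy illustration, not the actual ASIC/FPGA protocol; the class, the sub-area size, and the choice of CRC-32 (via zlib) are assumptions.

```python
import zlib

SUBAREA_SIZE = 4096  # assumed sub-area granularity (bytes)

class HybridNVM:
    """Toy model: triplicated COTS arrays + rad-hard CRC store."""

    def __init__(self, size: int):
        self.cots = [bytearray(size) for _ in range(3)]  # triplicated COTS NVM
        self.radhard_crcs: dict[int, int] = {}           # CRCs in rad-hard NVM

    def write(self, addr: int, data: bytes) -> None:
        for copy in self.cots:                           # write all three copies
            copy[addr:addr + len(data)] = data
        self._update_crcs(addr, len(data))

    def read(self, addr: int, length: int) -> bytes:
        copies = [bytes(c[addr:addr + length]) for c in self.cots]
        # Bytewise majority vote across the three copies.
        voted = bytes(max(set(trio), key=trio.count) for trio in zip(*copies))
        # Scrub: rewrite any copy that disagrees with the vote.
        for copy in self.cots:
            if bytes(copy[addr:addr + length]) != voted:
                copy[addr:addr + length] = voted
        return voted

    def _update_crcs(self, addr: int, length: int) -> None:
        first = addr // SUBAREA_SIZE
        last = (addr + length - 1) // SUBAREA_SIZE
        for sub in range(first, last + 1):
            chunk = self.read(sub * SUBAREA_SIZE, SUBAREA_SIZE)
            self.radhard_crcs[sub] = zlib.crc32(chunk)

    def scrub_subarea(self, sub: int) -> bool:
        """Background monitor: True if the sub-area still matches its CRC."""
        chunk = self.read(sub * SUBAREA_SIZE, SUBAREA_SIZE)
        return zlib.crc32(chunk) == self.radhard_crcs.get(sub)

mem = HybridNVM(4 * SUBAREA_SIZE)
mem.write(0, b"telemetry frame 001")
mem.cots[1][3] ^= 0xFF                       # inject a single-copy bit upset
assert mem.read(0, 19) == b"telemetry frame 001"  # vote corrects and scrubs it
```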

    Remotely Powered Reconfigurable Receiver for Extreme Sensing Platforms

    Unmanned space programs currently enable scientists to explore and research the furthest reaches of outer space. Systems and methods for low-power communication devices in accordance with embodiments of the invention are disclosed, describing a wide variety of low-power communication devices capable of remotely collecting, processing, and transmitting data from outer space in order to further mankind's goal of exploring the cosmos. Many embodiments of the invention include a flash-based FPGA, an energy-harvesting power supply module, a sensor module, and a radio module. By utilizing technologies that withstand the harsh environment of outer space, more reliable low-power communication devices can be deployed, enhancing their quality and longevity and enabling more data to be gathered in the exploration of outer space.
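
    The module decomposition maps naturally onto a duty-cycled sense-process-transmit loop. The following sketch shows one plausible organization of such a node; every class, function, and energy figure here is hypothetical, standing in for the FPGA logic, harvester, sensor, and radio described above.

```python
import random
import time

class EnergyHarvester:
    """Stands in for the energy-harvesting power supply module."""
    def __init__(self):
        self.stored_j = 0.0
    def harvest(self) -> None:
        self.stored_j += random.uniform(0.0, 0.05)  # ambient energy (J), assumed
    def can_afford(self, cost_j: float) -> bool:
        return self.stored_j >= cost_j
    def spend(self, cost_j: float) -> None:
        self.stored_j -= cost_j

def sense() -> float:
    """Stands in for the sensor module (e.g. a temperature reading)."""
    return 200.0 + random.gauss(0.0, 0.5)  # kelvin, hypothetical

def node_loop(cycles: int, tx_cost_j: float = 0.1) -> None:
    """Duty-cycled loop: harvest until a transmission is affordable."""
    harvester = EnergyHarvester()
    for _ in range(cycles):
        harvester.harvest()
        if harvester.can_afford(tx_cost_j):
            reading = sense()               # sensor module
            packet = f"T={reading:.2f}K"    # FPGA: process + frame the protocol
            harvester.spend(tx_cost_j)      # radio module: transmit
            print("TX:", packet)
        time.sleep(0.01)                    # otherwise stay in low-power standby

node_loop(cycles=50)
```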

    Cryogenic Quenching Process for Electronic Part Screening

    The use of electronic parts at cryogenic temperatures (less than -100 °C) in extreme environments is not well controlled or developed from a product quality and reliability point of view. This is in contrast to the very rigorous and well-documented procedures for qualifying electronic parts for mission use in the -55 to +125 °C temperature range. A similarly rigorous methodology for screening and evaluating electronic parts needs to be developed so that mission planners can expect the same level of high-reliability performance from parts operated at cryogenic temperatures. A formal methodology for screening and qualifying electronic parts at cryogenic temperatures has been proposed. The methodology focuses on the underlying physics of failure of the devices at cryogenic temperatures. All electronic part reliability follows the bathtub curve: a high rate of initial failures (infant mortality), a long period of normal use (random failures), and then an increasing number of failures (end of life). Unique to this approach is the development of custom screening procedures to eliminate early failures at cold temperatures. The ability to screen out defects will specifically improve reliability at cold temperatures. Cryogenic reliability is limited by electron-trap creation in the oxide and by defect sites at conductor interfaces. Non-uniform conduction processes due to process marginalities are magnified at cryogenic temperatures. Carrier mobilities change by orders of magnitude at cryogenic temperatures, significantly enhancing the effects of electric field. Marginal contacts, impurities in oxides, and defects in conductor/conductor interfaces can all be magnified at low temperatures. The novelty is the use of an ultra-low-temperature, short-duration quenching process for defect screening. The quenching process is designed to identify those defects that will precisely (and negatively) affect long-term cryogenic part operation. The quench occurs at a temperature at least 25 °C colder than the coldest expected operating temperature. This process is the opposite of the standard burn-in procedure: normal burn-in raises the temperature (and voltage) to quickly activate any manufacturing defects remaining in the device that were not already rejected at a functional test step. The proposed inverse burn-in, or quenching, process is custom-tailored to the electronic device being used. The doping profiles, materials, minimum dimensions, interfaces, and thermal expansion coefficients are all taken into account in determining the ramp rate, dwell time, and temperature.
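
    A small helper makes the quench-temperature rule concrete. The sketch below applies the stated margin of at least 25 °C below the coldest expected operating temperature; the ramp-rate and dwell-time defaults are placeholders, since the abstract says those values are tailored per device.

```python
from dataclasses import dataclass

@dataclass
class QuenchProfile:
    quench_temp_c: float   # temperature held during the quench
    ramp_c_per_min: float  # cool-down ramp rate (device-specific)
    dwell_min: float       # dwell time at the quench temperature

def quench_profile(coldest_operating_c: float,
                   margin_c: float = 25.0,
                   ramp_c_per_min: float = 5.0,
                   dwell_min: float = 30.0) -> QuenchProfile:
    """Quench at least `margin_c` below the coldest operating point.

    The ramp and dwell defaults are illustrative placeholders; per the
    abstract they must be derived from the device's doping profiles,
    materials, minimum dimensions, interfaces, and thermal expansion
    coefficients.
    """
    return QuenchProfile(
        quench_temp_c=coldest_operating_c - margin_c,
        ramp_c_per_min=ramp_c_per_min,
        dwell_min=dwell_min,
    )

# Example: a part whose mission-coldest operating point is -120 degC
# is quenched at -145 degC or below.
print(quench_profile(coldest_operating_c=-120.0))
```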

    Dynamic Adhesive Wettability of Wood

    Adhesive wettability of wood is usually evaluated by contact angle measurement. Because of liquid penetration and spreading on the wood surface, the contact angle changes as a function of time. In this study, a wetting model was developed to describe the dynamic contact angle process, in which a parameter (K) quantifies adhesive penetration and spreading during the wetting process. Applying the wetting model, the adhesive wettability of sapwood and heartwood of southern pine and Douglas-fir was studied. Liquid wettability along and across the wood grain direction was also compared. Two resin systems, polymeric diphenylmethane diisocyanate (PMDI) and phenol-formaldehyde (PF), were evaluated. The study showed that the wetting model accurately describes the dynamic adhesive wetting process on wood surfaces. Applying this model shows that PMDI resin exhibits better wettability on wood than PF resin. Adhesives wet the surface more easily along the grain direction than across it. Species and drop location have no significant effect on the spreading and penetration rate (K-value); however, the interaction between species and resin type has a significant effect on the K-value. PMDI exhibits a greater K-value on the Douglas-fir surface, while PF resin shows a greater K-value on the southern pine surface. Heartwood shows a lower instantaneous contact angle than sapwood. Douglas-fir has a greater instantaneous contact angle than southern pine. The effect of species on the equilibrium contact angle is strongly dependent on the location of the drop on the wood surface. The equilibrium contact angle of Douglas-fir is smaller than that of southern pine for sapwood, but greater for heartwood.
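
    The abstract does not reproduce the model's equation, so the sketch below assumes a generic exponential-relaxation form, θ(t) = θe + (θi − θe)·exp(−K·t), in which K plays the described role of a combined spreading and penetration rate. The functional form and the measurements are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def contact_angle(t, theta_i, theta_e, K):
    """Assumed relaxation form: decay from the instantaneous contact
    angle theta_i toward the equilibrium angle theta_e at rate K."""
    return theta_e + (theta_i - theta_e) * np.exp(-K * t)

# Hypothetical measurements: time (s) vs. contact angle (degrees).
t_s = np.array([0, 5, 10, 20, 40, 60, 90, 120], dtype=float)
theta_deg = np.array([62.0, 51.3, 44.1, 35.0, 27.8, 25.1, 23.6, 23.1])

(theta_i, theta_e, K), _ = curve_fit(
    contact_angle, t_s, theta_deg, p0=(60.0, 20.0, 0.05)
)
# A larger fitted K means faster spreading/penetration on that surface.
print(f"theta_i = {theta_i:.1f} deg, theta_e = {theta_e:.1f} deg, "
      f"K = {K:.3f} 1/s")
```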

    An Evaluation of Analysis Methods to Eliminate the Effect of Density Variation in Property Comparisons of Wood Composites

    The objective of this research was to evaluate commonly used data-analysis methods for property comparisons of wood composites that eliminate the effect of density variation among board test specimens, and to suggest a more reasonable and robust method. The methods reviewed included averaging, specific strength, and analysis of covariance (ANCOVA). The indicator-variable method was also applied to the property comparison and compared with the other methods. The modulus of rupture of wood fiber/polymer fluff composites manufactured with different material combinations and press temperatures was tested in the experiment to evaluate the different analysis methods. The results indicated that the choice of statistical analysis method is very important in the study of the physical and mechanical properties of wood composites. The specific strength method is limited to strength comparisons for high-density composites. Analysis of covariance can be applied to all property comparisons, for either high- or low-density composites, to eliminate the effect of density variation. However, error arises in property comparisons using ANCOVA when the slopes of the regression lines of property vs. specific gravity (SG) differ between the composites being tested. The indicator-variable method is shown to be more reliable than the specific strength and ANCOVA methods because it compares the linear regression lines of property vs. SG by testing both intercept and slope, based on data over the whole specific-gravity range of the test specimens.
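
    In regression terms, the indicator-variable method fits a single model with a dummy variable for board type plus a dummy-by-SG interaction, then tests both coefficients: an insignificant intercept shift and slope shift together imply no property difference once specific gravity is accounted for. A minimal sketch with statsmodels and made-up data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical modulus-of-rupture (MOR) vs. specific gravity data for
# two composite types, A and B.
n = 30
sg = np.concatenate([rng.uniform(0.5, 1.0, n), rng.uniform(0.5, 1.0, n)])
board = np.repeat(["A", "B"], n)
mor = np.where(board == "A", 10 + 40 * sg, 14 + 38 * sg)
mor = mor + rng.normal(0, 2, 2 * n)

df = pd.DataFrame({"mor": mor, "sg": sg, "board": board})

# Indicator-variable model: board enters as a dummy (intercept shift)
# and as a dummy-by-SG interaction (slope shift).
model = smf.ols("mor ~ sg * C(board)", data=df).fit()
print(model.summary().tables[1])
# The C(board)[T.B] row tests the intercept difference and the
# sg:C(board)[T.B] row tests the slope difference over the whole SG range.
```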

    Precise delay measurement through combinatorial logic

    A high-resolution circuit and method for precise measurement of on-chip delays in FPGAs for reliability studies. The circuit embeds a pulse generator on an FPGA chip together with one or more groups of LUTs (the "LUT delay chain"), also on-chip. The circuit also embeds a pulse-width measurement circuit on-chip and measures the duration of the generated pulse through the delay chain. The width of the output pulse represents the delay through the delay chain without any I/O delay. The pulse-width measurement circuit uses an additional asynchronous clock, autonomous from the main clock, and the FPGA propagation delay can be displayed continuously on a hex display for testing purposes.
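
    The measurement principle, counting edges of an asynchronous clock while the generated pulse is high, can be illustrated numerically. The simulation below shows how the autonomous clock's random phase lets averaging over many pulses recover sub-period resolution; the clock period and per-LUT delay are illustrative assumptions, not figures from the circuit.

```python
import random

def measure_pulse_ns(pulse_width_ns: float,
                     meas_clock_period_ns: float = 2.5,
                     trials: int = 10_000) -> float:
    """Average pulse width estimated by counting asynchronous clock
    edges that fall inside the pulse. Because the measurement clock is
    autonomous, its phase relative to the pulse is uniformly random,
    so averaging over many pulses recovers sub-period resolution."""
    total = 0
    for _ in range(trials):
        phase = random.uniform(0.0, meas_clock_period_ns)
        # Number of rising edges landing inside the pulse window.
        edges = int((pulse_width_ns - phase) // meas_clock_period_ns) + 1
        total += max(edges, 0)
    return total / trials * meas_clock_period_ns

# Example: a 16-LUT delay chain at ~0.9 ns per LUT (hypothetical).
true_width = 16 * 0.9
print(f"true width {true_width:.2f} ns, "
      f"measured {measure_pulse_ns(true_width):.2f} ns")
```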

    Estimating Maximum Water Absorption of Wood Fiber/Polymer Fluff Composites

    The objective of this study was to develop a model to estimate the maximum water absorption (MWA) of wood fiber/polymer fluff composites as a function of polymer fluff content and board density. Dry-process wood fiber/polymer fluff composites bonded with polymeric diphenylmethane diisocyanate (PMDI) resin were used in this study. Six polymer fluff contents (0, 15, 30, 45, 60, and 100%) and four target oven-dry board densities in the range of 0.50-1.00 g/cm³ were studied. A water immersion test was conducted on these boards. The effect of irreversible thickness swelling after water immersion (TSi) on the estimation of maximum water absorption was evaluated. The irreversible thickness swelling had a quadratic relationship with polymer fluff content and a linear relationship with oven-dry board density. The TSi of the composites used in this study was in the range of only 0.04-4.20%, which was negligible in the estimation of maximum water absorption. The prediction of maximum water absorption from the MWA model developed in this study was over 95% accurate for most of the specimens. The maximum water absorption had a linear relationship with polymer fluff content and a reciprocal relationship with board density.
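
    The stated relationships, linear in fluff content and reciprocal in density, suggest a model of the general form MWA = (a + b·F)/ρ. The exact equation and coefficients are not given in the abstract, so the form and the data below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def mwa_model(X, a, b):
    """Assumed form: MWA linear in fluff fraction F and reciprocal
    in oven-dry board density rho (g/cm^3)."""
    F, rho = X
    return (a + b * F) / rho

# Hypothetical data: fluff content (fraction), density, measured MWA (%).
F = np.array([0.00, 0.15, 0.30, 0.45, 0.60, 0.30, 0.30])
rho = np.array([0.70, 0.70, 0.70, 0.70, 0.70, 0.50, 1.00])
mwa = np.array([85.0, 74.0, 63.0, 52.0, 41.0, 88.0, 44.0])

(a, b), _ = curve_fit(mwa_model, (F, rho), mwa, p0=(60.0, -50.0))
print(f"MWA ~= ({a:.1f} + {b:.1f}*F) / rho")
print(f"Predicted MWA at F=0.45, rho=0.80: "
      f"{mwa_model((0.45, 0.80), a, b):.1f}%")
```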

    The Dark Energy Survey Data Management System

    The Dark Energy Survey collaboration will study cosmic acceleration with a 5000 deg² grizY survey in the southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated archive, a processing framework, an ensemble of astronomy codes, and a data access framework. We are developing the DESDM system for operation in the high-performance computing (HPC) environments at NCSA and Fermilab. Operating the DESDM system in an HPC environment offers both speed and flexibility. We will employ it for our regular nightly processing needs and for more compute-intensive tasks such as large-scale image coaddition campaigns, extraction of weak-lensing shear from the full survey dataset, and massive seasonal reprocessing of the DES data. Data products will be available to the collaboration and later to the public through a virtual-observatory-compatible web portal. Our approach leverages investments in publicly available HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the storage, database platforms, and orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we tested the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database. We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera. Comparison to truth tables in the case of the simulated data and internal cross-checks in the case of the real data indicate that astrometric and photometric data quality is excellent.
    Comment: To be published in the proceedings of the SPIE conference on Astronomical Instrumentation (held in Marseille in June 2008). This preprint is made available with the permission of SPIE. Further information, together with a preprint containing full-quality images, is available at http://desweb.cosmology.uiuc.edu/wik